Sufficient Dimension Reduction With Missing Predictors

Authors

  • Lexin LI
  • Wenbin LU
Abstract

In high-dimensional data analysis, sufficient dimension reduction (SDR) methods are effective in reducing the predictor dimension, while retaining full regression information and imposing no parametric models. However, it is common in high-dimensional data that a subset of predictors may have missing observations. Existing SDR methods resort to the complete-case analysis by removing all the subjects with missingness in any of the predictors under inquiry. Such an approach does not make effective use of the data and is valid only when missingness is independent of both observed and unobserved quantities. In this article, we propose a new class of SDR estimators under a more general missingness mechanism that allows missingness to depend on the observed data. We focus on a widely used SDR method, sliced inverse regression, and propose an augmented inverse probability weighted sliced inverse regression estimator (AIPW–SIR). We show that AIPW–SIR is doubly robust and asymptotically consistent and demonstrate that AIPW–SIR is more effective than the complete-case analysis through both simulations and real data analysis. We also outline the extension of the AIPW strategy to other SDR methods, including sliced average variance estimation and principal Hessian directions.
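As a rough illustration of the weighting idea behind the proposed estimator, the sketch below runs sliced inverse regression on the complete cases, with each case weighted by the inverse of its (pre-estimated) probability of being fully observed. It omits the augmentation term that gives AIPW–SIR its double robustness, and the function name ipw_sir, the argument names, and the slicing scheme are illustrative assumptions rather than details taken from the article.

```python
# Minimal sketch: inverse-probability-weighted SIR on complete cases.
# This shows only the IPW part; the paper's AIPW-SIR adds an
# augmentation term that is not reproduced here.
import numpy as np

def ipw_sir(X, y, weights, n_slices=5, n_directions=1):
    """Weighted sliced inverse regression.

    X       : (n, p) fully observed predictor rows (complete cases)
    y       : (n,) responses for those rows
    weights : (n,) inverse probability weights 1 / pi_i
    """
    n, p = X.shape
    w = weights / weights.sum()

    # Weighted standardization Z = Sigma^{-1/2} (X - mu)
    mu = w @ X
    Xc = X - mu
    Sigma = (Xc * w[:, None]).T @ Xc
    evals, evecs = np.linalg.eigh(Sigma)
    Sigma_inv_sqrt = evecs @ np.diag(evals ** -0.5) @ evecs.T
    Z = Xc @ Sigma_inv_sqrt

    # Slice the response and form the weighted SIR kernel matrix
    M = np.zeros((p, p))
    for idx in np.array_split(np.argsort(y), n_slices):
        ph = w[idx].sum()                 # weighted slice proportion
        mh = (w[idx] @ Z[idx]) / ph       # weighted slice mean of Z
        M += ph * np.outer(mh, mh)

    # Leading eigenvectors of M, mapped back to the original X scale
    _, vecs = np.linalg.eigh(M)
    return Sigma_inv_sqrt @ vecs[:, -n_directions:]
```

In practice the observation probabilities pi_i would themselves be estimated, for example by a logistic regression of the missingness indicator on the always-observed variables, and the resulting 1/pi_i passed in as the weights.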


Related articles

Sufficient dimension reduction via Bayesian mixture modeling

Dimension reduction is central to an analysis of data with many predictors. Sufficient dimension reduction aims to identify the smallest possible number of linear combinations of the predictors, called the sufficient predictors, that retain all of the information in the predictors about the response distribution. In this article, we propose a Bayesian solution for sufficient dimension reduction...


Estimating Sufficient Reductions of the Predictors in Abundant High-dimensional Regressions, by R. Dennis Cook and Liliana Forzani

We study the asymptotic behavior of a class of methods for sufficient dimension reduction in high-dimensional regressions, as the sample size and number of predictors grow in various alignments. It is demonstrated that these methods are consistent in a variety of settings, particularly in abundant regressions where most predictors contribute some information on the response, and oracle rates are ...


Testing Predictor Contributions in Sufficient Dimension Reduction

We develop tests of the hypothesis of no effect for selected predictors in regression, without assuming a model for the conditional distribution of the response given the predictors. Predictor effects need not be limited to the mean function and smoothing is not required. The general approach is based on sufficient dimension reduction, the idea being to replace the predictor vector with a lower...


On Partial Sufficient Dimension Reduction with Applications to Partially Linear Multi-index Models

Partial dimension reduction is a general method to seek informative convex combinations of predictors of primary interest, which includes dimension reduction as its special case when the predictors in the remaining part are constants. In this paper, we propose a novel method to conduct partial dimension reduction estimation for predictors of primary interest without assuming that the remaining ...


Asymptotic Properties of Sufficient Dimension Reduction with a Diverging Number of Predictors.

We investigate asymptotic properties of a family of sufficient dimension reduction estimators when the number of predictors p diverges to infinity with the sample size. We adopt a general formulation of dimension reduction estimation through least squares regression of a set of transformations of the response. This formulation allows us to establish the consistency of reduction projection estim...



Journal:

Volume   Issue

Pages  -

Publication date: 2008